# SFT-DPO fine-tuning

## SauerkrautLM Mixtral 8x7B GGUF

License: Apache-2.0

SauerkrautLM Mixtral 8x7B is a multilingual text-generation model based on the Mixtral architecture. It has been fine-tuned and aligned using SFT and DPO, and supports English, German, French, Italian, and Spanish.

Tags: Large Language Model · Transformers · Supports Multiple Languages

Uploaded by TheBloke · 403 downloads · 8 likes
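
The card's note that the model was "fine-tuned and aligned using SFT and DPO" refers to a two-stage pipeline: supervised fine-tuning on instruction data, then Direct Preference Optimization on preference pairs. Below is a minimal sketch of the DPO stage with Hugging Face TRL; the base model, dataset, and hyperparameters are illustrative assumptions, not the SauerkrautLM authors' actual recipe, and TRL's argument names have shifted across versions.

```python
# Minimal DPO sketch with Hugging Face TRL. All names and hyperparameters are
# illustrative assumptions. A tiny placeholder model is used so the outline
# stays runnable; the real pipeline would start from an SFT'd Mixtral 8x7B.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_name = "gpt2"  # placeholder base model (assumption)
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 ships without a pad token

# DPO trains on preference pairs: for each prompt, one preferred ("chosen")
# and one dispreferred ("rejected") completion.
pairs = Dataset.from_dict({
    "prompt":   ["Translate 'good morning' into German."],
    "chosen":   ["Guten Morgen."],
    "rejected": ["Bonjour."],
})

trainer = DPOTrainer(
    model=model,
    args=DPOConfig(output_dir="dpo-out", beta=0.1, max_steps=1),  # beta scales the implicit KL penalty
    train_dataset=pairs,
    processing_class=tokenizer,  # older TRL versions name this argument `tokenizer`
)
trainer.train()
```

Omitting `ref_model` lets TRL clone the starting model as the frozen reference against which the preference objective is computed.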
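Because this release is distributed as GGUF quantizations, it can be run locally through llama.cpp bindings. A minimal sketch with llama-cpp-python follows; the Hugging Face repo id and quantization filename are assumptions, so check the uploader's repository for the files actually published.

```python
# Minimal sketch: download a GGUF quantization and run a prompt locally.
# The repo id and filename below are assumptions; verify them against the
# uploader's actual repository before running.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/SauerkrautLM-Mixtral-8x7B-GGUF",  # assumed repo id
    filename="sauerkrautlm-mixtral-8x7b.Q4_K_M.gguf",   # assumed filename
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,       # context window; raise if memory allows
    n_gpu_layers=-1,  # offload all layers if llama.cpp was built with GPU support
)

# The model is multilingual, so a German prompt works as well as an English one.
out = llm("Erkläre kurz, was DPO-Finetuning ist.", max_tokens=200)
print(out["choices"][0]["text"])
```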